Optimal subspaces and constrained principal component analysis
Authors
Abstract
Similar resources
Optimal Subspaces and Constrained Principal Component Analysis
The main result of this article allows formulas of analytic geometry to be elegantly unified by readily providing parametric as well as Cartesian systems of equations. These systems characterize affine subspaces in R^p passing through specified affine subspaces of lower dimension. The problem solved is closely related to constrained principal component analysis. A few interesting applications a...
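The constraint described in this abstract, a fitted affine subspace required to pass through a specified affine subspace of lower dimension, can be made concrete with a short sketch. The NumPy construction below is one standard way to impose such a constraint: remove the required directions from the data, run ordinary PCA on the remainder, and append the resulting directions to the required ones. The function name, the inputs a0 and B, and the construction itself are illustrative assumptions rather than formulas taken from the article.

```python
import numpy as np

def constrained_pca(Y, a0, B, d):
    """Fit a d-dimensional affine subspace to the rows of Y (n x p),
    constrained to contain the affine subspace a0 + span(B), where B is
    p x q with q <= d.  Returns a point on the fitted subspace and a
    p x d orthonormal basis.  Illustrative sketch, not the article's method."""
    Q, _ = np.linalg.qr(B)                    # orthonormalise the required directions (p x q)
    Z = Y - a0                                # deviations from a point of the required subspace
    R = Z - Z @ Q @ Q.T                       # remove what span(B) already explains
    _, _, Vt = np.linalg.svd(R, full_matrices=False)
    extra = Vt[: d - Q.shape[1]].T            # top (d - q) principal directions of the residual
    return a0, np.hstack([Q, extra])          # the fitted subspace contains a0 + span(B)

# Tiny usage example with synthetic data: require the plane to contain the x1-axis.
rng = np.random.default_rng(0)
Y = rng.normal(size=(50, 4))
point, basis = constrained_pca(Y, a0=np.zeros(4), B=np.eye(4)[:, :1], d=2)
```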
Constrained generalised principal component analysis
Generalised Principal Component Analysis (GPCA) is a recently devised technique for fitting a multicomponent, piecewise-linear structure to data that has found strong utility in computer vision. Unlike other methods which intertwine the processes of estimating structure components and segmenting data points into clusters associated with putative components, GPCA estimates a multi-component stru...
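As a concrete illustration of the idea sketched in this abstract, the snippet below treats the simplest case: points drawn from the union of two lines through the origin in R^2. A single quadratic that vanishes on the union is recovered from a degree-2 Veronese embedding, and its gradient segments the points by line. The two-line special case, the chosen directions, and all variable names are assumptions made for illustration; full GPCA handles general subspace arrangements.

```python
import numpy as np

# Two lines through the origin in R^2, given by direction vectors d1 and d2
# (their normals are n1 = [1, 2] and n2 = [3, -1], for reference).
rng = np.random.default_rng(1)
d1, d2 = np.array([-2.0, 1.0]), np.array([1.0, 3.0])
X = np.vstack([rng.normal(size=(60, 1)) * d1,
               rng.normal(size=(60, 1)) * d2])        # 120 points on the union

# Degree-2 Veronese embedding: every point of the union satisfies one quadratic.
V = np.column_stack([X[:, 0] ** 2, X[:, 0] * X[:, 1], X[:, 1] ** 2])

# Coefficients of the fitting polynomial p(x, y) = c0 x^2 + c1 xy + c2 y^2
# span the null space of the embedded data matrix.
_, _, Vt = np.linalg.svd(V)
c = Vt[-1]

# The gradient of p at a point on line i is normal to line i, so comparing
# normalised gradients segments the data into the two components.
grad = np.column_stack([2 * c[0] * X[:, 0] + c[1] * X[:, 1],
                        c[1] * X[:, 0] + 2 * c[2] * X[:, 1]])
grad /= np.linalg.norm(grad, axis=1, keepdims=True)
labels = (grad @ grad[0]) ** 2 > 0.5                  # True for points on the same line as X[0]
```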
Cone-Constrained Principal Component Analysis
Estimating a vector from noisy quadratic observations is a task that arises naturally in many contexts, from dimensionality reduction, to synchronization and phase retrieval problems. It is often the case that additional information is available about the unknown vector (for instance, sparsity, sign or magnitude of its entries). Many authors propose non-convex quadratic optimization problems th...
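One generic way to exploit such side information is a projected power iteration over the constraint cone, sketched below for the nonnegative orthant: multiply by the observation matrix, project onto the cone, renormalise, repeat. This heuristic, the function name, and the synthetic example are illustrative assumptions; they are not necessarily the estimator or the optimisation problems analysed in the article.

```python
import numpy as np

def nonnegative_pc(A, iters=200, seed=0):
    """Projected power iteration for a cone-constrained leading eigenvector:
    approximately maximise v^T A v subject to ||v||_2 = 1 and v >= 0."""
    rng = np.random.default_rng(seed)
    v = np.abs(rng.normal(size=A.shape[0]))
    v /= np.linalg.norm(v)
    for _ in range(iters):
        w = np.maximum(A @ v, 0.0)        # power step, then projection onto the cone
        nrm = np.linalg.norm(w)
        if nrm == 0.0:                    # degenerate step: restart from a random cone point
            w = np.abs(rng.normal(size=A.shape[0]))
            nrm = np.linalg.norm(w)
        v = w / nrm                       # back onto the unit sphere
    return v

# Noisy quadratic observations A = lambda * x x^T + symmetric noise, with x >= 0.
rng = np.random.default_rng(3)
x = np.abs(rng.normal(size=30)); x /= np.linalg.norm(x)
G = rng.normal(size=(30, 30))
A = 5.0 * np.outer(x, x) + (G + G.T) / (2 * np.sqrt(30))
print(float(nonnegative_pc(A) @ x))       # correlation with the planted vector
```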
Optimal sparse L1-norm principal-component analysis
We present an algorithm that computes exactly (optimally) the S-sparse (1≤S<D) maximum-L1-norm-projection principal component of a real-valued data matrix X ∈ RD×N that contains N samples of dimension D. For fixed sample support N , the optimal L1-sparse algorithm has linear complexity in data dimension, O (D). For fixed dimension D (thus, fixed sparsity S), the optimal L1-sparse algorithm has ...
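To make the objective concrete, the brute-force sketch below solves the S-sparse maximum-L1-norm-projection problem on a tiny instance by enumerating supports and sign vectors. This exponential search is for illustration only; the point of the algorithm described above is precisely to avoid that cost. The function name and the example sizes are assumptions.

```python
import itertools
import numpy as np

def sparse_l1_pc_bruteforce(X, S):
    """Exhaustively find the S-sparse unit vector w maximising ||X^T w||_1
    for a D x N data matrix X.  Only feasible for very small D, N, S."""
    D, N = X.shape
    best_val, best_w = -np.inf, None
    for support in itertools.combinations(range(D), S):
        Xs = X[list(support), :]                       # restrict to S candidate rows
        for signs in itertools.product((-1.0, 1.0), repeat=N):
            u = Xs @ np.array(signs)                   # optimal direction for this sign pattern
            nrm = np.linalg.norm(u)
            if nrm == 0.0:
                continue
            u /= nrm
            val = np.abs(Xs.T @ u).sum()               # L1-norm projection on this support
            if val > best_val:
                w = np.zeros(D)
                w[list(support)] = u
                best_val, best_w = val, w
    return best_w, best_val

# Tiny example: D = 4 features, N = 5 samples, sparsity S = 2.
rng = np.random.default_rng(7)
w, val = sparse_l1_pc_bruteforce(rng.normal(size=(4, 5)), S=2)
```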
Optimal Solutions for Sparse Principal Component Analysis
Given a sample covariance matrix, we examine the problem of maximizing the variance explained by a linear combination of the input variables while constraining the number of nonzero coefficients in this combination. This is known as sparse principal component analysis and has a wide array of applications in machine learning and engineering. We formulate a new semidefinite relaxation to this pro...
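A widely used semidefinite relaxation of this problem (in the style of DSPCA) lifts the loading vector w to the matrix W = w w^T, replaces the rank-one and cardinality constraints with convex surrogates, and solves the resulting SDP. The CVXPY sketch below shows that generic relaxation; it may differ in detail from the relaxation formulated in the article, and the covariance matrix and cardinality target are made up for the example.

```python
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(11)
A = rng.normal(size=(40, 8))
Sigma = A.T @ A / 40                        # sample covariance matrix (8 x 8)
k = 3                                       # target number of nonzero loadings

W = cp.Variable((8, 8), PSD=True)           # lifted variable standing in for w w^T
prob = cp.Problem(
    cp.Maximize(cp.trace(Sigma @ W)),       # relaxed explained variance
    [cp.trace(W) == 1,                      # encodes ||w||_2 = 1 after lifting
     cp.sum(cp.abs(W)) <= k],               # entrywise l1 surrogate for card(w) <= k
)
prob.solve()

# Approximate sparse loading vector: leading eigenvector of the SDP solution.
vals, vecs = np.linalg.eigh(W.value)
w = vecs[:, -1]
```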
Journal
Journal title: The Electronic Journal of Linear Algebra
Year: 2003
ISSN: 1081-3810
DOI: 10.13001/1081-3810.1107